Patent Abstract:
Techniques for signaling content limits: methods and devices provide a user interface that gives a visual cue when a panned or scrolled document has reached an end or limit, by distorting the document image in response to further user inputs. The image distortion functions may include shrinking, stretching, accordion-style stretching, or bouncing of the document image. The degree of image distortion can be proportional to the distance by which a user input would move the document beyond the encountered limit. When a document image limit is reached during panning or fast scrolling, a bouncing image distortion can be applied to the document image in order to inform the user that the document reached a limit during the movement.
Publication number: BR112012008792B1
Application number: R112012008792-4
Filing date: 2010-10-04
Publication date: 2021-03-23
Inventors: Diego A. Wilson; Sean S. Rogers; Per O. Nielsen
Applicant: Qualcomm Incorporated
IPC main class:
Patent Description:

[0001] The present invention relates generally to computer user interface systems and, more specifically, to user interface systems that provide an image distortion functionality.

Background
[0002] Personal computing devices (e.g., cell phones, PDAs, laptops, gaming devices) provide users with ever-increasing functionality and data storage. Personal computing devices serve as personal organizers, storage for documents, photographs, videos and music, and portals to the Internet and electronic mail. In order to fit within the small displays of such devices, large documents (e.g., photographs, music files and contact lists) are typically presented on the display one portion at a time, at a size large enough to be read, and are navigated with scrolling or panning functions that reveal the other portions. To view all or parts of a document image, or to sort through a list of digital files, typical graphical user interfaces let users invoke scrolling or panning functions by tracing directional movements of a finger on a touch-sensitive display, as implemented on mobile devices such as the Blackberry Storm®. However, because of the small display size of portable computing devices, the entire content of a document image generally does not fit on the screen. Since only a portion of a large document is usually presented on the display of a computing device, users can lose their position and orientation with respect to the document as a whole. As a result, users may be unable to perceive the limits of a document image from the portion of the document image shown on the display.

Summary
[0003] The various aspects include methods for implementing a user interface function on a computing device, which include detecting a document motion gesture input on a user interface device of the computing device (e.g., a touch event on a touch-sensitive surface), determining the movement of the document image of a document displayed on the computing device based on the motion gesture input, determining whether a document boundary is reached based on the determined movement of the document image, and distorting the document image displayed on the computing device when a document image limit is reached. Distorting the document image can involve one of stretching the document image, shrinking the document image and bouncing the document image. Distorting the document image can include distorting the document image along an axis selected from a horizontal axis, a vertical axis, a diagonal axis, and both horizontal and vertical axes. Distorting the document image can include distorting the document image in an accordion manner by inserting a space between display image elements without distorting the display image elements themselves. The aspect methods can be implemented on a computing device in which the user interface device is a touch-sensitive surface and the document image motion gesture input is a touch event detected on the touch-sensitive surface, such as a touchscreen. Distorting the document image may include distorting the document image based on the distance the touch event travels after the document boundary is reached. In another aspect, the methods may also include starting a panning or fast scrolling animation if the user input represents a flick gesture, determining whether the end of the panning or fast scrolling animation has been reached, and animating the distortion of the document image if the document image limit is reached before the end of the panning or fast scrolling animation. Such animation of the document image distortion can include animating a bouncing movement of the document image, or animating a bouncing movement combined with a compression distortion of the document image. In another aspect, the methods may also include determining a distortion function to be applied to distort the document image and calculating a distortion factor based on the distance by which the document image motion gesture input would move the document image after the limit is reached, in which distorting the document image includes applying the determined distortion function to the display image based on the calculated distortion factor. This aspect may also include determining whether a maximum level of display image distortion is reached, and reverting the display image back to its original shape if the maximum level of display image distortion is reached.
[0004] Another aspect is a computing device that includes a processor, a display coupled to the processor, and a user interface device (such as, for example, a touch-sensitive surface, a touchscreen, a computer mouse, a trackball, etc.) coupled to the processor, in which the processor is configured with processor-executable instructions to perform the operations of the methods of the various aspects.
[0005] Another aspect is a computing device that includes means for performing the functions involved in the operations of the methods of the various aspects.
[0006] Another aspect is a computer-readable storage medium having computer-executable instructions stored thereon that are configured to cause a computer to execute the processes of the methods of the various aspects.

Brief Description of the Drawings
[0007] The accompanying drawings, which are incorporated herein and constitute part of this specification, illustrate exemplary aspects of the invention. Together with the general description given above and the detailed description given below, the drawings serve to explain features of the invention.
[0008] Figure 1A is a front view of a portable computing device illustrating a list document on a touchscreen.
[0009] Figure 1B is a front view of a portable computing device illustrating an end-of-list distortion functionality activated by a finger moving upwards on a touchscreen, according to an aspect.
[0010] Figures 2A and 2B are front views of a portable computing device illustrating a distortion functionality activated by a finger moving upwards on a touchscreen, according to another aspect.
[0011] Figure 3 is a process flow diagram of an aspect method for applying a distortion function to a display image.
[0012] Figure 4 is a process flow diagram of an aspect method for applying a bouncing distortion function to a display image.
[0013] Figure 5 is a component block diagram of an exemplary mobile computing device suitable for use with the various aspects.
[0014] Figure 6 is a component block diagram of an exemplary portable computing device suitable for use with the various aspects.

Detailed Description
[0015] The various aspects will be described in detail with reference to the accompanying drawings. Whenever possible, the same reference numbers will be used throughout the drawings to refer to the same or similar parts. References made to specific examples and implementations are for illustrative purposes and are not intended to limit the scope of the invention or the claims.
[0016] The word "exemplary" is used herein to mean "serving as an example, instance, or illustration". Any implementation described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other implementations.
[0017] As used herein, the terms "personal electronic device", "computing device" and "portable computing device" refer to any or all of cell phones, personal data assistants (PDAs), palmtop computers, notebook computers, personal computers, wireless e-mail receivers and cellular telephone receivers (e.g., Blackberry® and Treo® devices), multimedia Internet-enabled cell phones (e.g., the Blackberry Storm®), and similar electronic devices that include a programmable processor, a memory, and a connected or integral touch-sensitive surface or other pointing device (e.g., a computer mouse). In the examples used to illustrate various aspects of the present invention, the computing device is a cell phone that includes an integral touchscreen. However, this aspect is presented merely as one exemplary implementation of the various aspects, and is not intended to exclude other possible implementations of the subject matter recited in the claims.
[0018] As used herein, a "touchscreen" is a touch-sensing input device with an associated image display. As used herein, a "touchpad" is a touch-sensing input device without an associated image display. A touchpad, for example, can be implemented on any surface of a computing device outside the image display area. Touchscreens and touchpads are generically referred to herein as a "touch-sensitive surface". Touch-sensitive surfaces may be integral parts of a computing device, such as a touchscreen, or a separate module, such as a touchpad, that can be coupled to the computing device by a wired or wireless data link. The terms touchscreen, touchpad and touch-sensitive surface may be used interchangeably hereinafter.
[0019] As used herein, a "touch event" refers to a detected user input on a touch-sensitive surface that may include information regarding the location or relative location of the touch. For example, on a touchscreen or touchpad user interface, a touch event refers to the detection of a user touching the device, and may include information regarding the location on the device being touched.
[0020] As used herein, "document" or "digital document" refers to any of a variety of digital content of which at least a part can be displayed as a document image on the display of a computing device. Examples of digital documents covered by the term "document" include, but are not limited to, digital photograph files (e.g., .tif, .jpg), document files (e.g., .pdf, .txt), word processing files (e.g., .doc, .rtf), graphics and drawing files (e.g., .vsd), spreadsheet files (e.g., .xls), database files (e.g., .xml, .sql), contact databases (e.g., address book application files kept on cell phones), file listings, music library listings, and similar digital lists. Thus, the reference to "document" is not intended to limit the scope of the claims to a written document, but covers all forms of digital information that can be displayed in accordance with the various aspects.
[0021] As used herein, "document image" refers to the image that a computing device can generate based on document content for presentation on the display of the computing device. Examples of document images include, but are not limited to, digital photographs, displayed portions of a text or word processing document (that is, excluding metadata and formatting data), displayed graphics and widgets that present information contained in a database or in file lists, extended menu lists (e.g., a song listing), and the view generated from spreadsheet files.
[0022] As used herein, "document boundary" refers to any edge, border or limit of a document beyond which the document image cannot extend. Examples of document boundaries include, but are not limited to, the beginning of a list, database or text document, the end of a list, database or text document, each edge of a digital photograph, and the first and last entries in a database record.
[0023] Since an image of the entire contents of a document, such as a word processing document, image or contact list, may not fit entirely on the touchscreen of a computing device, most graphical user interfaces provide scrolling and panning functionality that allows the document image to be moved within the device's viewing window. To allow users to move document images within a touchscreen, a computing device can be configured to detect touch gestures traced on the screen and to move the document image relative to the viewing window in response to the direction of the touch gestures. Typical document control gestures enabled on touchscreen computing devices include drag, pan, scroll and flick gestures. In drag, pan and scroll gestures, the user touches the touchscreen and moves the touching finger in the desired direction of movement without lifting the finger. For example, to see the entire contents of a contact list that does not fit within the display frame of the computing device's touchscreen, the user can trace a scroll gesture on the touchscreen in order to scroll down through the contact list. The scrolling function lets the user bring different parts of the contact list into the display frame of the computing device, one part at a time. In a flick gesture, the user touches a touch-sensitive surface (e.g., a touchscreen), quickly moves a finger up/down or left/right across it, and lifts the finger in the same motion, as if to fling the content in a particular direction. It should be understood that a flick gesture (or a user input equivalent to a flick gesture) can also be performed on other types of user input devices, such as by rapidly spinning a scroll wheel on a computer mouse or by rapidly rotating a trackball. Thus, references to a flick gesture are not necessarily limited to a flick gesture performed on a touch-sensitive surface (e.g., a touchscreen).
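As a rough illustration, one common way to tell a flick from a drag or scroll is the finger's velocity at release. The following minimal Kotlin sketch shows the idea; the threshold value and all names are illustrative assumptions, not taken from the patent.

```kotlin
import kotlin.math.abs

// Assumed threshold: release velocities above this are treated as flicks.
const val FLICK_VELOCITY_THRESHOLD = 0.5f // pixels per millisecond (illustrative)

data class TouchSample(val y: Float, val timeMs: Long)

// A drag or scroll ends slowly; a flick ends with the finger still moving fast.
fun isFlick(previous: TouchSample, release: TouchSample): Boolean {
    val dtMs = (release.timeMs - previous.timeMs).coerceAtLeast(1L)
    val velocity = abs(release.y - previous.y) / dtMs
    return velocity > FLICK_VELOCITY_THRESHOLD
}

fun main() {
    println(isFlick(TouchSample(400f, 0), TouchSample(398f, 16)))  // slow release: false
    println(isFlick(TouchSample(400f, 0), TouchSample(250f, 16)))  // fast release: true
}
```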
[0024] Unlike standard personal computers, which track the movement and boundaries of a document image with a display indicator, such as a scroll bar on the computer screen, many portable computing devices do not include such display indicators. The absence of display indicators on portable computing device displays frees up valuable display space in which more of the document image can be shown. However, the user of a computing device who is navigating through a large document (e.g., a long contact list) will often not be aware of the document's limits (e.g., the top or end of a list). For example, when scrolling down through an e-mail list by tracing a series of scroll gestures on the touchscreen, the user may not be able to discern where the list starts or ends. When the user reaches the end of the e-mail list without knowing that a final limit has been reached, the user may continue to make scroll gestures in an attempt to keep scrolling through the list. When the list stops responding to scroll gestures because its end has been reached, the user may wonder whether the computing device has frozen. The only way to confirm that the computing device is working properly is to scroll in the other direction, which is an unnecessary action. Such ambiguous behavior at the end of a list or at a document boundary can cause user frustration and confusion.
[0025] The methods of the various aspects provide an image distortion functionality in the user interface that generates a visual indication or cue to inform users of computing devices when drag, pan, scroll and flick gestures would move a document beyond its limits. This visual indication includes an intuitive image distortion, as if the user's gesture were distorting a physical document. The computing device can be configured to detect when a document limit is reached while the user's gesture attempts to move the document beyond the limit, and to distort the display image in accordance with the user's gesture. For example, when the user reaches the end of an e-mail list while tracing a scroll gesture on a touchscreen, the computing device can distort the displayed image of the last e-mail items as if the list were a sheet of plastic or an accordion-like stack of connected items. Thus, the methods of the various aspects generate an intuitive cue for users when their drag, pan, scroll and flick gestures have reached a document limit, without having to display scroll bar indicators that take up valuable display "real estate".
[0026] Although the various aspects are described with reference to examples implemented in computing devices equipped with touchscreen displays, the aspects and claims are not limited to such implementations. The various aspects can be implemented in any computing device that displays document images extending beyond the edges of the display window and that receives user inputs from user input devices (such as a touch-sensitive surface, a computer mouse, a trackball, etc.) to move the document image with respect to the display window. The various aspects provide visual cues intuitive enough to eliminate the need for scroll bar indicators, thereby increasing the display space available for presenting the document image. For example, a document image movement gesture, such as a drag, pan, scroll or flick, can also be performed with a conventional computer mouse or trackball, as is well known. Thus, the examples of performing drag, pan, scroll and flick gestures on a touchscreen user interface device refer to a particularly advantageous implementation of the aspects, and are not intended to limit the scope of the claims to touchscreen user interface devices.
[0027] In different aspects, the image distortion functionality can be enabled as part of the graphical user interface (GUI) functionality or as part of an application (e.g., a contact list or photo viewer application). The image distortion functionality may be activated automatically within the GUI, or provided as part of an application. In addition, the image distortion functionality can be provided as a pre-defined function within a GUI or application.
[0028] In some aspects, the image distortion functionality can be enabled manually. The user can select and activate an icon on a GUI display. For example, activation of the functionality can be assigned to a function key, which the user can actuate (e.g., by pressing or clicking) to turn on the image distortion functionality. As another example, the image distortion functionality can be activated by a user command. For instance, the user can issue a voice command, such as "activate end-of-list signal", to enable the image distortion functionality. Once activated, the image distortion functionality can be used in the manner described herein.
[0029] The image distortion functionality can respond to any form of user touch gesture implemented on the touch-sensitive surface of the computing device. Since touchscreens are usually superimposed on a display image, in a particularly useful implementation the touch-sensitive surface is a touchscreen, thus allowing users to interact with the display image at the touch of a finger. In such applications, the user interacts with an image by touching the touchscreen with a finger and tracing paths back and forth or up and down.
[0030] Different aspects can distort a display image using different types of distortion, such as shrinking, stretching or bouncing the image. For example, the computing device may distort the display image by uniformly stretching the content (referred to as "flat stretching"), stretching only part of the content (referred to as "partial stretching"), stretching a part of the image along one direction while shrinking it along the perpendicular dimension (referred to as "local stretching"), or creating empty space between fixed-size items (referred to as "accordion stretching").
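The simplest of these, flat stretching, can be expressed as a coordinate transform. A minimal Kotlin sketch follows; the names and the example factor are illustrative, not from the patent.

```kotlin
// Uniform ("flat") stretch: every point of the document image is scaled by
// the same factor along the scroll axis, measured from a fixed anchor (for a
// bottom-of-list stretch, the anchor would be the top of the visible content).
fun flatStretchY(rowY: Float, anchorY: Float, stretchFactor: Float): Float =
    anchorY + (rowY - anchorY) * stretchFactor

fun main() {
    // Rows at y = 0, 100, 200 stretched by 10% about anchor y = 0:
    listOf(0f, 100f, 200f).forEach { y -> println(flatStretchY(y, 0f, 1.1f)) }
}
```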
[0031] Figures 1A and 1B illustrate an end-of-list distortion functionality in which a uniform or flat stretching distortion is applied to a list document in order to inform the user that the end of a contact list has been reached. As shown in Figure 1A, the user can scroll through a contact list by stroking the touchscreen surface 104 with a finger 108, as shown by the dotted arrow 106. The dotted arrow 106 is shown only to illustrate the direction of the traced gesture. Since the display 102 of the computing device 100 does not include a display indicator to let the user determine the limits of the contact list, the user may have no way of knowing when the end of the list has been reached.
[0032] As illustrated in Figure 1B, the computing device 100 can be configured to inform the user that the end of the list has been reached by distorting the display image in response to further scroll gestures by the user. In the illustrated example, the computing device 100 is configured to apply a uniform distortion that evenly stretches the display image along the direction of the scroll gesture, as if the user were stretching an elastic material on which the contact information is printed. Such visual distortion of the contact list intuitively informs the user that the end of the list has been reached and that the computing device is working properly, even though the list is no longer scrolling.
[0033] In order to apply a uniform or flat stretching distortion, the computing device 100 can scale the display image evenly. Thus, uniform or flat stretching distortion factors can be applied to the display image along the direction of the user's touch gestures. As shown in Figure 1B, a user input gesture that attempts to move or scroll the document image beyond the end or limit of the document can stretch the document along the scroll direction. Other forms of distortion can also be applied. For example, a user input gesture that attempts to move or scroll beyond a document boundary can cause a stretch along the axis of the user's touch gesture and a narrowing or shrinking along the perpendicular direction, as would occur with an elastic fabric. In another aspect, the distortion can be applied in the opposite orientation, so that if the user's input gesture tries to move the document image horizontally on the touchscreen, the image is stretched along its vertical axis (that is, its height), and if the user tries to move the document vertically on the touchscreen, the image can be stretched along its horizontal axis (that is, its width). In another aspect, if the user's input gesture tries to move the document in a diagonal scroll gesture (e.g., toward the corners of the touchscreen), the image can be stretched along both the horizontal and vertical axes in proportion to the angle of the traced touch gesture.
[0034] In one aspect, the degree to which a display image is stretched is determined by a distortion factor that is calculated based on the movement that the user's input gesture, such as a touch gesture on a touchscreen, would impart to the document image beyond the encountered limit. The computing device 100 can be configured to calculate the distortion factor based on the distance by which the user's input gesture (e.g., a touch event) would move the document once a limit is reached. In Figure 1B, for example, the computing device 100 can detect the end of the contact list and stretch the contact list based on the distance the user's finger 108 moves on the touchscreen 104 once the end of the contact list is reached. The farther the user's finger 108 moves on the touchscreen 104, the greater the distortion factor and the more the contact list image is distorted. When the user ends the touch event, the stretched display image automatically reverts back to its original shape.
[0035] The distortion factor can be applied so as to distort the display image proportionally. When applying a stretch effect to the display image, for example, the computing device 100 can be configured to distort the display image in proportion to the distance by which the document image motion gesture input would move the document image beyond the document boundary, such as the distance the touch event travels on the touchscreen 104 after the document boundary is reached. Thus, the longer the distance traveled by the touch event, the greater the distortion of the display image.
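A minimal sketch of such a proportional factor, assuming a linear mapping; the tuning constants are illustrative, and the clamp anticipates the maximum-distortion limit discussed in the next paragraph.

```kotlin
// The factor grows linearly with the distance the gesture would carry the
// document past its boundary, clamped at an assumed maximum.
fun distortionFactor(
    overscrollPx: Float,
    sensitivity: Float = 0.002f,  // assumed tuning constant
    maxFactor: Float = 1.3f,      // assumed maximum distortion
): Float = (1f + overscrollPx * sensitivity).coerceAtMost(maxFactor)
```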
[0036] In one aspect, the computing device 100 can be configured to limit the extent to which a display image can be distorted. Once the display image reaches its maximum distorted shape, no further distortion is applied. A stretched display image can also automatically revert back to its original shape when the user ends the touch event.
[0037] The automatic reversion of display images back to their original shape can occur in different ways. For example, the automatic reversion can be an animated reversal of the distortion or an animated underdamped oscillation. In an animated reversal, the computing device 100 can be configured to animate the return of the image to its original shape, as if undoing the stretch.
[0038] The animated underdamping effect reflects the behavior of elastic systems. In nature, when elastic systems are released from tension, they often do not stop when they reach their original shapes; instead, they continue to bounce back and forth around the original shape before settling into their original states. When an elastic band is stretched and released, for example, the band moves toward its resting state but does not immediately assume its original shape. As the band releases potential energy, it may first shrink and stretch again a number of times before resuming its original shape. The computing device 100 can be configured to animate the image distortion transition to mimic such natural effects, in order to give the user the perception of interacting with real-world objects.
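One way to model such an underdamped return is an exponentially decaying cosine. The sketch below assumes that form; the decay and frequency constants are illustrative, and a real implementation would tune them per platform.

```kotlin
import kotlin.math.PI
import kotlin.math.cos
import kotlin.math.exp

// x(t) = x0 * e^(-decay * t) * cos(2π * f * t): a decaying oscillation about
// the rest shape, applied as the residual stretch offset during reversion.
fun underdampedOffset(
    initialOffset: Float,
    tSeconds: Float,
    decay: Float = 6f,       // assumed decay rate (1/s)
    frequencyHz: Float = 4f, // assumed oscillation frequency
): Float = initialOffset * exp(-decay * tSeconds) *
    cos(2f * PI.toFloat() * frequencyHz * tSeconds)
```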
[0039] As mentioned above, one type of image distortion that can be applied in different aspects is accordion stretching, which inserts space between image elements without changing the shapes of the elements themselves. This aspect is illustrated in Figures 2A and 2B. Accordion distortion creates the appearance of a material stretched between the content elements (e.g., graphical widgets) of a document image. The appearance of webbing or rubber material between fixed-size image elements can give the appearance of stretching the entire image without distorting the actual content of the image.
[0040] Figure 2A illustrates a contact list being navigated by the user, who traces scroll gestures with a finger 108. The user can move the contact list upwards by touching the surface 104 of the touchscreen of a computing device 100 with the finger 108 and tracing an upward scroll gesture along the dotted arrow 106. As with the computing device 100 shown in Figures 1A and 1B, the display 102 does not include a scroll bar indicator to inform the user of the limits of the contact list. Therefore, the user may not know when the end of the list is reached.
[0041] Figure 2B shows the accordion-stretching distortion behavior of the display when the bottom end of the contact list is reached. Figure 2B illustrates how the accordion-stretching distortion does not alter the shape of the contact list's display elements (that is, contact names, images and telephone numbers). Instead, the image appears to stretch the webbing 112 arranged between the contact data display elements.
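A sketch of the accordion layout idea, assuming a simple vertical list with hypothetical row heights; the stretch is expressed entirely as extra gap between rows.

```kotlin
// Each row keeps its own height; the distortion only widens the gaps between
// consecutive rows. Returns the top y coordinate of each row.
fun accordionRowTops(rowHeights: List<Int>, extraGapPx: Int): List<Int> {
    var y = 0
    return rowHeights.map { height ->
        val top = y
        y += height + extraGapPx
        top
    }
}

fun main() {
    println(accordionRowTops(listOf(48, 48, 48), extraGapPx = 0))   // [0, 48, 96]
    println(accordionRowTops(listOf(48, 48, 48), extraGapPx = 12))  // [0, 60, 120]
}
```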
[0042] Other stretching effects can also be applied in the same manner as described above with respect to Figures 1 and 2. In one aspect, a computing device 100 can be configured to apply partial stretching effects to the display image, so that, when a document image limit is reached, only certain parts of the display image are stretched while the rest remains unchanged. For example, the part of the display image that is stretched can be the part lying between the touch location of the finger 108 and the reached limit of the document image.
[0043] In another aspect, the display image may appear stretched using a local/logarithmic stretching method. In this style of stretching, the display image can appear stretched in a logarithmic manner, allowing the stretched portion to ease smoothly to a stop within the portion of the document image that does not appear stretched.
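A one-line sketch of a logarithmic easing of this kind; the scale constant is an assumed tuning value.

```kotlin
import kotlin.math.ln

// Displacement grows with the logarithm of the overscroll distance, so the
// stretched region decelerates smoothly toward the unstretched region.
fun logStretchOffset(overscrollPx: Float, scale: Float = 24f): Float =
    scale * ln(1f + overscrollPx / scale)
```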
[0044] In another aspect, the display image can be distorted by local compression of its content. This style of distortion can be applied, for example, when the entire document image fits within the display frame. As the user tries to move the document image in one direction, the side of the document image in the direction of movement may appear compressed, while the opposite side of the document image may appear stretched. For example, an entire contact list may fit inside the display 102 of a computing device 100. If the user tries to move the contact list down by tracing a downward scroll on the surface 104 of the touchscreen, the contact list content located below the touch location of the finger 108 can be compressed while the content above the finger 108 is stretched.
[0045] In another aspect, when a document limit is reached during a fast scroll, as can occur following a flick gesture, the image can be distorted with a bouncing effect to inform the user that a limit of the document image has been reached. A fast scroll typically occurs when the user makes a flick gesture to move quickly through the content of a document (e.g., a list) or to pan a large image using fewer scroll gestures. Generally, the length of the scroll will depend on the speed and/or distance of the flick gesture. For example, a computing device 100 can be configured to determine the speed of a flick gesture traced on the surface 104 of the touchscreen and, based on the determined speed, calculate an end point for the scrolling or panning, such as the number of list elements to be scrolled or the number of pixels an image must move relative to the display window. A flick gesture can result in the end of a list or the limit of an image being reached before the calculated end point is reached. To inform the user that the flick gesture has reached the end of a list or a document limit, the computing device 100 can be configured to bounce the image within the display. In a bouncing effect, the displayed image may appear to reach a limit in one direction, reverse direction by a small amount, and then reverse again before settling at the end of the document (such as with the last item of the list displayed, or the part of an image that includes the encountered limit).
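Assuming a constant simulated friction, the fling end point follows from elementary kinematics. A hedged sketch; the friction value is illustrative.

```kotlin
// Under constant deceleration, a fling starting at velocity v travels
// v^2 / (2 * friction) before stopping. If that distance would pass the
// document boundary, a bounce is shown instead of a plain stop.
fun flingDistance(velocityPxPerS: Float, frictionPxPerS2: Float = 2000f): Float =
    (velocityPxPerS * velocityPxPerS) / (2f * frictionPxPerS2)
```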
[0046] With the bouncing effect, the computing device 100 can also apply the stretching or shrinking effects described above with respect to Figures 1 and 2. For example, once a limit of a document image is reached during a fast scroll and a bouncing effect is applied, a stretch effect can also be applied in the direction of movement of the document image. Thus, if the document image is moving downwards relative to the display, when the bottom limit is reached the document image can be compressed like a ball hitting the ground, rebound a small distance, and then return to its original shape (that is, simulating a bounce) before coming to rest. In other aspects, shrinking effects, or a combination of stretching and shrinking effects, can also be used to simulate bouncing. The number of bounce and stretch cycles can be configured by an application developer to achieve a desired user experience.
[0047] In a typical computing device, a processor can receive touch event updates from a touchscreen (or other type of touch-sensitive surface) periodically, such as every few milliseconds. When the user touches the touchscreen, the processor can receive a "touch down" event notification on the next touchscreen update and, on each subsequent touchscreen update, receive a touch location notification until a "touch up" event notification is received. Since the touch location information provided by the touchscreen arrives very frequently, the movement of a user's touch across the surface of the touchscreen will be detected by the processor as a sequence of touch locations changing over time.
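A sketch of how such a sequence of locations becomes per-update movement requests; the helper name is hypothetical.

```kotlin
// Consecutive touch locations reported at successive updates; each delta is
// the movement the gesture requests for the document in that interval.
fun motionDeltas(touchLocationsY: List<Float>): List<Float> =
    touchLocationsY.zipWithNext { prev, curr -> curr - prev }

fun main() {
    println(motionDeltas(listOf(400f, 390f, 375f, 355f)))  // [-10.0, -15.0, -20.0]
}
```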
[0048] Figure 3 illustrates a process flow of an aspect method 300 for applying a distortion function to a displayed image in response to a scroll event beyond a document boundary. In block 302 of method 300, a computing device 100 can be configured to detect a touch event on a touch-sensitive surface, or a user input on another form of user interface device (e.g., a computer mouse or trackball), and determine whether the touch event was a touch-up event (or a release of a computer mouse button) in determination block 304. If a touch-up event is detected (that is, determination block 304 = "Yes"), the computing device 100 can terminate the distortion function in block 306. If the touch event was not a touch-up event (that is, determination block 304 = "No"), the computing device 100 can determine the document image movement that should be caused to occur based on the touch event (or other type of user interface input event) in block 308. This can be accomplished by comparing the current touch location (or other type of user interface input event) received in block 302 with the previous touch location (or other type of user interface input event), which can be stored in memory. If the touch location has changed, this may indicate that a drag, scroll or flick gesture is being traced to cause a movement in a displayed document (e.g., a scroll of a displayed list). In determination block 310, the computing device 100 can determine whether a limit of the displayed document image has been reached. If a document image limit has not been reached (that is, determination block 310 = "No"), the computing device 100 can perform a normal graphical user interface (GUI) function associated with the determined movement of the document in block 312, and return to block 302 to receive the next touch location event from the touchscreen (or other type of user interface input event). For example, if a document image limit has not been reached and the user is tracing a downward scroll, the computing device 100 can continue to scroll down the document image as long as the user's finger remains in contact with the touchscreen and continues to move.
[0049] Once a document image limit is reached (that is, determination block 310 = "Yes"), the computing device 100 can determine a distortion to be applied to the displayed image and, if appropriate, calculate the distortion factor to be used in the image distortion. The specific distortion to be applied to the displayed image may depend on the application, the nature of the document, user settings and other information. As discussed above, several different distortion functions can be applied according to the different aspects. Thus, an application developer can specify a particular distortion function for particular types of documents. In addition, an application can allow users to select a particular distortion function as part of their user settings. Thus, some users may prefer more distortion applied to images, making it easier to tell when a list or document limit has been reached, while other users may prefer little or no distortion. The degree of distortion to be applied can be determined by a distortion factor calculated based on the distance by which the user's input gesture (e.g., a touch event) would move the document image after a document limit is reached, that is, the distance by which the input gesture would carry the document beyond its limit. Thus, the farther the touch event (or other type of user input) travels, the greater the distortion applied to the image. In block 314, the computing device 100 can apply the determined distortion function to the display image according to the calculated distortion factor. For example, if the end of a contact list is reached while the user is tracing an upward scroll gesture, the computing device 100 can apply a distortion function, such as a stretch effect, to inform the user that the end of the document image has been reached.
[0050] The computing device continues to detect touch events (or computer mouse button presses, etc.) with each touchscreen update in block 302. Thus, the degree of image distortion applied to the displayed image will continue to depend on the user's input gesture, such as the location of the user's finger on the touchscreen. If the user scrolls backwards (that is, away from the document boundary) without lifting the finger from the touchscreen (or touchpad, computer mouse, etc.), the degree of distortion applied to the displayed image in block 314 will be reduced. If the user lifts the finger off the touchscreen (or touchpad, computer mouse, etc.), a touch-up event will be detected in block 302 and recognized in determination block 304, triggering termination of the distortion function in block 306. Terminating the distortion function allows the displayed image to return to normal. In one aspect, terminating the distortion function in block 306 may include animating the release of the distortion.
[0051] In a further aspect, in optional determination block 316, the computing device 100 can be configured to determine whether a maximum distortion level of the document image, as defined by the calculated distortion factor and the determined distortion function, has been reached. If the user continues a touch gesture (or other type of user interface input) after the image has been distorted to the maximum degree, this may indicate that the user is actually performing a different type of GUI input (that is, other than a drag, scroll or flick) for which image distortion is not appropriate. A maximum distortion level, which may depend on the other types of GUI touch gestures (or other user interface inputs) that can be performed, can be assigned to each document image in order to allow the computing device 100 to determine the extent to which the document image can be distorted. For example, an e-mail list can have a maximum distortion level of 3 centimeters, so that, once a limit is reached, the computing device 100 distorts the displayed e-mail items based only on the first 3 centimeters of a linear scroll gesture. If a touch event (or other type of user interface input) would cause more than the designated maximum distortion level, the computing device 100 can be configured to revert the image back to its original shape, as may be appropriate for another type of touch gesture. If the maximum image distortion is reached (that is, optional determination block 316 = "Yes"), the computing device 100 can revert the display image to its original form in optional block 318 before receiving the next touch event in block 302. If the maximum distortion level is not reached (that is, optional determination block 316 = "No"), the computing device 100 can continue with the current distortion level and receive the next touch event in block 302.
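Putting blocks 302-318 together, a hedged Kotlin sketch of the loop's state handling follows; all constants, names and the linear distortion mapping are assumptions rather than details from the patent.

```kotlin
import kotlin.math.abs

// Normal scrolling inside the limits, stretch distortion past a limit,
// reset on touch-up.
class OverscrollController(
    contentHeight: Float,
    viewHeight: Float,
    private val maxFactor: Float = 1.3f,  // assumed maximum distortion level
) {
    var scrollY = 0f
        private set
    var stretchFactor = 1f
        private set

    private val maxScroll = (contentHeight - viewHeight).coerceAtLeast(0f)

    // Blocks 308/310: determine the requested movement and test for the boundary.
    fun onDrag(deltaY: Float) {
        val target = scrollY + deltaY
        if (target in 0f..maxScroll) {
            scrollY = target              // block 312: normal GUI scroll
            stretchFactor = 1f
        } else {
            scrollY = target.coerceIn(0f, maxScroll)
            val overscroll = abs(target - scrollY)
            // block 314: distortion grows with overscroll, up to the maximum
            stretchFactor = (1f + 0.002f * overscroll).coerceAtMost(maxFactor)
        }
    }

    // Blocks 304/306: a touch-up event terminates the distortion function.
    fun onTouchUp() {
        stretchFactor = 1f
    }
}
```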
[0052] Figure 4 illustrates a process flow of another aspect method 400 for applying a distortion function during a fast pan or scroll, as may occur in response to a flick gesture. In block 302 of method 400, a computing device 100 can detect a touch event and determine whether the traced gesture is a flick gesture in determination block 404. If the gesture is not a flick gesture (that is, determination block 404 = "No"), the computing device 100 can proceed with the processing of method 300 described above with reference to Figure 3, determining whether the touch event (or other type of user interface input) was a touch-up event in determination block 304 (Figure 3). If the gesture is determined to be a flick gesture (that is, determination block 404 = "Yes"), the computing device 100 can determine the distance by which the document will be panned or scrolled in block 406. Methods for calculating the movement distance of a document, or the scrolling of a list, based on a flick gesture are well known and can be used in method 400.
[0053] The computing device can be configured to animate the movement of the document in response to the flick gesture, such as panning the document beneath the display window or performing a fast scroll through a list of elements. This animation can be achieved by sequentially determining the next display increment (e.g., the next item in the list) and generating a display that includes that increment, until the pan or fast scroll reaches a predicted end point. The determination of the next display increment can take deceleration factors into account, so that the motion appears fast initially but slows over time until the end of the pan or fast scroll is reached. However, on a conventional computing device that does not include scroll bars on the screen, the user may not be aware of when an end of the list or a document limit has been reached.
[0054] In order to give the user a visual indication that the pan or fast scroll of the document has encountered an end or limit of the document, the computing device can test for the boundary as it animates the movement. In determination block 408, for example, the computing device 100 can determine whether the end of the pan or fast scroll has been reached (that is, whether the document has reached the end of the path indicated by the flick gesture). If the end of the pan or scroll has been reached (that is, determination block 408 = "Yes"), the computing device 100 can continue normal processing by detecting the next touch event in block 302. If the end of the pan or scroll has not been reached (that is, determination block 408 = "No"), the animation should continue, so in block 410 the computing device 100 can determine the next display increment. As mentioned above, determining the next display increment can include determining the pan or scroll speed based on factors such as the speed imparted by the flick gesture and a simulated friction that imposes a deceleration effect on the animation. Once the next display increment has been determined, in determination block 414 the computing device 100 can determine whether the document limit is reached within that display increment. For example, the computing device 100 can determine whether the next display increment includes the end of an e-mail list. If the document limit has not been reached (that is, determination block 414 = "No"), the computing device 100 can generate a display showing the next display increment in block 412, and return to block 408 to again determine whether the end of the pan or fast scroll has been reached.
[0055] If the document limit is reached within the next display increment (that is, determination block 414 = "Yes"), the computing device 100 can calculate the distortion factor to be applied to the display image in a bouncing distortion animation in block 313. The distortion factor can be based on the pan or fast scroll speed at the moment the limit is encountered. Thus, if the limit is encountered immediately after a flick gesture, while the document is animated as moving quickly, the calculated distortion factor may be larger, and hence the amount of image distortion (e.g., the degree of stretching or the size of the bounce) may be more pronounced than if the limit were encountered near the end of the fast pan or scroll animation, when the document appears to be moving slowly. As mentioned above, the bouncing distortion animation can include one or more bounces and one or more image stretch or compression cycles. In block 416, the computing device 100 can apply a bouncing distortion animation to the display image according to the calculated distortion factor. Once the bounce animation has ended, the computing device can return to the normal GUI function, receiving the next touch event in block 302.
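A hedged sketch of this fling-with-bounce loop; the block numbers from Figure 4 appear only as comments, and all constants are illustrative assumptions.

```kotlin
import kotlin.math.abs
import kotlin.math.sign

// Frame-by-frame fling: friction decelerates the motion; if a boundary
// arrives before the fling's end, a bounce is triggered whose size scales
// with the remaining speed.
fun animateFling(
    startY: Float,
    startVelocityPxPerS: Float,
    maxScroll: Float,
    frameDtS: Float = 1f / 60f,
    frictionPxPerS2: Float = 2000f,
) {
    var y = startY
    var v = startVelocityPxPerS
    while (v != 0f) {
        y += v * frameDtS                              // block 410: next increment
        val dv = frictionPxPerS2 * frameDtS
        v = if (abs(v) <= dv) 0f else v - dv * sign(v) // decelerate toward rest
        if (y < 0f || y > maxScroll) {                 // block 414: boundary hit
            val bounceFactor = abs(v) / frictionPxPerS2
            println("boundary reached, bounce factor = $bounceFactor")
            return                                     // block 416: bounce animation
        }
    }
    println("fling ended at y = $y")                   // block 408: end reached
}
```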
[0056] As will be understood by those skilled in the art, methods 300 and 400 illustrated in Figures 3 and 4 represent examples of how the image distortion functions of the various aspects can be implemented in a computing device. Any number of variations of these processes can be implemented without departing from the scope of the claims.
[0057] The aspects described above can be implemented in any of several computing devices 100. Typically, such computing devices 100 will have in common the components shown in Figure 5. For example, computing devices 100 can include a processor 191 coupled to internal memory 192 and to a touch-sensitive surface input device 104 or display. The touch-sensitive surface input device 104 can be any touchscreen, infrared-sensing touchscreen, acoustic/piezoelectric-sensing touchscreen or the like. The various aspects are not limited to any particular type of touchscreen or touchpad technology. In addition, the computing device 100 may have an antenna 194, for sending and receiving electromagnetic radiation, that is connected to a wireless data link and/or a cellular telephone transceiver 195 coupled to the processor 191. Computing devices 100 that do not include a touchscreen input device 104 (and thus typically do not have a touchscreen display 102) usually do include a keypad 196 or miniature keyboard and menu selection keys or rocker switches 197, which serve as pointing devices. The processor 191 can also be connected to a wired network interface 198, such as a universal serial bus (USB) or FireWire® connector socket, for connecting the processor 191 to a touchpad or touch-sensitive surface to which the different aspects can also be applied.
[0058] In some implementations, a touch-sensitive surface may be provided in areas of the computing device 100 outside the touchscreen 104 or display 102. For example, the keypad 196 can include a touch-sensitive surface with buried capacitive touch sensors. In other implementations, the keypad 196 can be eliminated so that the touchscreen 104 provides the complete GUI. In still other implementations, the touch-sensitive surface can be an external touchpad connectable to the computing device 100 via a cable connector 198, or via a wireless transceiver (e.g., the transceiver 195) coupled to the processor 191.
[0059] The aspects described above can also be implemented within various computing devices, such as the laptop computer 600 illustrated in Figure 6. Many laptop computers include a touch-sensitive surface that serves as the computer's pointing device and can thus receive drag, scroll and flick gestures similar to those implemented on mobile devices equipped with a touchscreen. A laptop computer 600 will typically include a processor 601 coupled to volatile memory 602 and to a large-capacity non-volatile memory, such as a disk drive 603. The computer 600 may also include a floppy disk drive 604 and a compact disc (CD) drive 605 coupled to the processor 601. The computing device 600 can also include multiple connector ports coupled to the processor 601 for establishing data connections or receiving external memory devices, such as USB or FireWire® connector sockets, or other network connection circuits 606 for coupling the processor 601 to a network. In a notebook configuration, the computer housing includes the touchpad 607, the keyboard 608 and the display 609, all coupled to the processor 601. Other configurations of the computing device may include a computer mouse or trackball coupled to the processor (e.g., through a USB port), as is well known.
[0060] The processor 191, 601 of the computing device can be any programmable microprocessor, microcomputer, or multiple-processor chip or chips that can be configured by software instructions (applications) to perform a variety of functions, including the functions of the various aspects described above. In some portable computing devices 100, multiple processors 191 may be provided, such as one processor dedicated to wireless communication functions and one processor dedicated to running other applications. The processor can also be included as part of a communications chipset.
[0061] The foregoing method descriptions and process flow diagrams are provided merely as illustrative examples, and are not intended to require or imply that the processes of the various aspects must be performed in the order presented. As will be appreciated by those skilled in the art, the blocks and processes in the foregoing aspects can be performed in any order. Words such as "next", "then", "subsequently", etc., are not intended to limit the order of the processes; these words are used simply to guide the reader through the description of the methods. Furthermore, any reference to claim elements in the singular, for example using the articles "a", "an" or "the", is not to be construed as limiting the element to the singular.
[0062] The various illustrative logical blocks, modules, circuits and algorithm processes described in connection with the aspects disclosed herein can be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, circuits and algorithms have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends on the particular application and the design constraints imposed on the overall system. Those skilled in the art may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
[0063] The hardware used to implement the various logics, logical blocks, modules and circuits described in connection with the aspects disclosed herein can be implemented or performed with a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general-purpose processor can be a microprocessor, but alternatively the processor can be any conventional processor, controller, microcontroller or state machine. A processor can also be implemented as a combination of computing devices, for example a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration. Alternatively, some processes or methods can be performed by circuitry that is specific to a given function.
[0064] In one or more exemplary aspects, the functions described can be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions can be stored on, or transmitted as, one or more instructions or code on a computer-readable medium. The processes of a method or algorithm disclosed herein can be embodied in a processor-executable software module, which can reside on a computer-readable medium. Computer-readable media include both computer storage media and communication media, including any medium that facilitates the transfer of a computer program from one place to another. A storage medium can be any available medium that can be accessed by a computer. By way of example, and not limitation, such computer-readable media can comprise RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a general-purpose or special-purpose computer. In addition, any connection is properly termed a computer-readable medium. For example, if the software is transmitted from a website, server or other remote source using coaxial cable, fiber optic cable, twisted pair, digital subscriber line (DSL) or wireless technologies such as infrared, radio and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL or wireless technologies such as infrared, radio and microwave are included in the definition of medium. Disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of computer-readable media. In addition, the operations of a method or algorithm may reside as one or any combination or set of codes and/or instructions stored on a machine-readable medium and/or computer-readable medium, which may be incorporated into a computer program product.
[0065] The foregoing description of the various aspects is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these aspects will be readily apparent to those skilled in the art, and the generic principles defined herein can be applied to other aspects without departing from the scope of the invention. Thus, the present invention is not intended to be limited to the aspects shown herein, but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
Claims:
Claims (10)
[0001]
A method for implementing a user interface function on a computing device (100), characterized by comprising: detecting a document image motion gesture input (106) on a user interface device of the computing device (100); determining a document image movement of a document displayed on the computing device (100) based on the motion gesture input (106); determining whether a document boundary is reached based on the determined movement of the document image; determining a distortion function to be applied to distort the document image; calculating a distortion factor based on a distance by which the document image motion gesture input (106) would move the document image after the boundary is reached; and distorting an image of the document displayed on the computing device (100) when a document image boundary is reached, by inserting a space (112) between display image elements based on the determined distortion function using the calculated distortion factor.
[0002]
Method according to claim 1, characterized in that distorting the document image includes distorting the document image along an axis selected from a horizontal axis, a vertical axis, a diagonal axis, and both horizontal and vertical axes.
[0003]
Method according to claim 1, characterized in that: the user interface device is a touch-sensitive surface (104); and the document image movement gesture input (106) is a touch event detected on the touch-sensitive surface (104).
[0004]
Method according to claim 3, characterized in that the touch-sensitive surface (104) is a touch-sensitive screen (102).
[0005]
Method according to claim 3, characterized in that distorting the document image comprises distorting the document image based on a distance that the touch event travels after the document limit is reached.
[0006]
Method according to claim 3, characterized in that it further comprises: initiating a panning or fast scrolling animation if the touch event represents a flick gesture; determining whether the end of the panning or fast scrolling animation has been reached; and animating the document image distortion if the document image limit is reached before the end of the panning or fast scrolling animation is reached.
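Claim 6's hand-off from a fling to a boundary animation can likewise be sketched. The Kotlin below is a hypothetical model only; the function name, parameters, and decay constants are assumptions, not the patent's implementation. It steps a fling frame by frame and, if a limit is hit before the fling's velocity decays, switches from scrolling to a bounce distortion.

```kotlin
import kotlin.math.abs

// Hypothetical sketch of claim 6; names and constants are illustrative.
// Steps a fling (fast scroll) frame by frame: if a document limit is hit
// before the fling's velocity decays, the scroll hands off to a bounce
// ("continuous contact") distortion animation instead of stopping dead.
fun animateFling(
    startOffset: Float,
    maxOffset: Float,                    // document height minus viewport height
    initialVelocity: Float,              // px per frame; sign gives direction
    friction: Float = 0.95f,
    onFrame: (offsetY: Float) -> Unit,
    onBounce: (velocityAtImpact: Float) -> Unit
) {
    var offset = startOffset
    var velocity = initialVelocity
    while (abs(velocity) > 0.5f) {       // end of the fling not yet reached
        offset += velocity
        if (offset < 0f || offset > maxOffset) {
            // Document limit reached mid-fling: stop scrolling and animate
            // the distortion; claim 7's bounce would be scaled by the
            // velocity at impact.
            onBounce(velocity)
            return
        }
        onFrame(offset)
        velocity *= friction             // decay toward the fling's natural end
    }
}
```

Driving the bounce from the impact velocity is one plausible design choice: a fast flick that hits the limit produces a more pronounced distortion than one that barely reaches it.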
[0007]
Method according to claim 6, characterized in that animating the document image distortion comprises animating a continuous contact (bouncing) movement of the document image.
[0008]
Method according to claim 1, characterized in that it further comprises: determining whether a maximum level of display image distortion is reached; and reverting the display image back to its original shape if the maximum level of display image distortion is reached.
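The cap-and-revert behavior of claim 8 reduces to a small piece of state. This Kotlin sketch is again an assumption-laden illustration (the class name, threshold, and snap-back behavior are not specified by the patent): it accumulates distortion while the user keeps dragging past the limit and reverts the image once the maximum is reached.

```kotlin
// Hypothetical sketch of claim 8; the class and threshold are illustrative.
// Accumulates distortion while the user keeps dragging past the limit and
// reverts the display image to its original shape once a maximum is reached.
class DistortionState(private val maxLevel: Float) {
    var level = 0f
        private set

    // Returns the level to render; snaps back to zero at the maximum.
    fun update(overshootDelta: Float): Float {
        level = (level + overshootDelta).coerceAtLeast(0f)
        if (level >= maxLevel) {
            level = 0f   // revert to the original, undistorted shape
        }
        return level
    }
}
```

An actual implementation would likely animate the reversion rather than snapping, but the cap-and-revert decision is the same.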
[0009]
Computing device (100), characterized in that it comprises: mechanisms for displaying a document image; mechanisms for detecting a document image movement gesture input (106); mechanisms for determining a document image movement of a document displayed on the mechanisms for displaying a document image based on the movement gesture input (106); mechanisms for determining whether a document limit is reached based on the determined document image movement; mechanisms for determining a distortion function to be applied to distort the document image; mechanisms for calculating a distortion factor based on a distance that the document image movement gesture input (106) would move the document image after the limit is reached; and mechanisms for distorting the document image displayed on the mechanisms for displaying a document image when a document image limit is reached, by inserting a space (112) between display image elements based on the determined distortion function using the calculated distortion factor.
[0010]
Computer-readable memory, characterized in that it comprises instructions stored thereon that, when executed, cause a processor of a computer to perform the method as defined in any one of claims 1 to 8.
Similar technologies:
Publication number | Publication date | Patent title
BR112012008792B1|2021-03-23|CONTENT LIMIT SIGNALING TECHNIQUES
US10338736B1|2019-07-02|Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
BR102013016792B1|2021-02-23|CONTROL METHOD BASED ON TOUCH AND GESTURE INPUT AND TERMINAL FOR THAT
BR112013026162A2|2020-10-27|method and apparatus for intuitive scrolling of lists in a user interface
US9417779B2|2016-08-16|Scrollable area multi-scale viewing
US20140325455A1|2014-10-30|Visual 3d interactive interface
BRPI0908274B1|2020-02-11|Computer readable means, computer device for interpreting an ambiguous touch event in relation to a touchscreen and method for interpreting an ambiguous touch event in relation to one or more click targets associated with an application
US20120284668A1|2012-11-08|Systems and methods for interface management
TW201344525A|2013-11-01|Touch control electrical device and method for storing information of page displayed thereon
Patent family:
Publication number | Publication date
US20110090255A1|2011-04-21|
EP2488934A2|2012-08-22|
BR112012008792A2|2020-09-15|
KR20120093963A|2012-08-23|
WO2011046766A3|2011-07-21|
WO2011046766A2|2011-04-21|
US8633949B2|2014-01-21|
US8624925B2|2014-01-07|
JP2013508812A|2013-03-07|
CN102597944A|2012-07-18|
US20130222312A1|2013-08-29|
EP3605296A1|2020-02-05|
Cited documents:
Publication number | Filing date | Publication date | Applicant | Patent title

US7000199B2|2001-05-09|2006-02-14|Fairisaac And Company Inc.|Methodology for viewing large strategies via a computer workstation|
JP3761165B2|2002-05-13|2006-03-29|株式会社モバイルコンピューティングテクノロジーズ|Display control device, portable information terminal device, program, and display control method|
US7317449B2|2004-03-02|2008-01-08|Microsoft Corporation|Key-based advanced navigation techniques|
US20070132789A1|2005-12-08|2007-06-14|Bas Ording|List scrolling in response to moving contact over list of index symbols|
US7844915B2|2007-01-07|2010-11-30|Apple Inc.|Application programming interfaces for scrolling operations|
US7469381B2|2007-01-07|2008-12-23|Apple Inc.|List scrolling and document translation, scaling, and rotation on a touch-screen display|
US9569088B2|2007-09-04|2017-02-14|Lg Electronics Inc.|Scrolling method of mobile terminal|
US8624925B2|2009-10-16|2014-01-07|Qualcomm Incorporated|Content boundary signaling techniques|
JP2751439B2|1989-08-04|1998-05-18|神鋼電機株式会社|Furnace wall cooling mechanism of induction melting furnace|
US10298834B2|2006-12-01|2019-05-21|Google Llc|Video refocusing|
US7469381B2|2007-01-07|2008-12-23|Apple Inc.|List scrolling and document translation, scaling, and rotation on a touch-screen display|
EP2045700A1|2007-10-04|2009-04-08|LG Electronics Inc.|Menu display method for a mobile communication terminal|
US9083814B2|2007-10-04|2015-07-14|Lg Electronics Inc.|Bouncing animation of a lock mode screen in a mobile communication terminal|
TWI412963B|2009-07-01|2013-10-21|Htc Corp|Data display and movement methods and systems, and computer program products thereof|
KR101588242B1|2009-07-13|2016-01-25|삼성전자주식회사|Apparatus and method for scroll of a portable terminal|
US8624925B2|2009-10-16|2014-01-07|Qualcomm Incorporated|Content boundary signaling techniques|
JP4850277B2|2009-10-28|2012-01-11|株式会社ソニー・コンピュータエンタテインメント|Scroll display program, apparatus and method, and electronic terminal equipped with scroll display apparatus|
US8812985B2|2009-10-30|2014-08-19|Motorola Mobility Llc|Method and device for enhancing scrolling operations in a display device|
US8839128B2|2009-11-25|2014-09-16|Cooliris, Inc.|Gallery application for content viewing|
US8799816B2|2009-12-07|2014-08-05|Motorola Mobility Llc|Display interface and method for displaying multiple items arranged in a sequence|
DE102009058727A1|2009-12-17|2011-06-22|Bayerische Motoren Werke Aktiengesellschaft, 80809|Method and computer for displaying information on a display device of a vehicle|
US20110161892A1|2009-12-29|2011-06-30|Motorola-Mobility, Inc.|Display Interface and Method for Presenting Visual Feedback of a User Interaction|
US9417787B2|2010-02-12|2016-08-16|Microsoft Technology Licensing, Llc|Distortion effects to indicate location in a movable data collection|
US20110225545A1|2010-03-09|2011-09-15|Horodezky Samuel J|System and method of displaying graphical user interface objects|
US8595645B2|2010-03-11|2013-11-26|Apple Inc.|Device, method, and graphical user interface for marquee scrolling within a display area|
US8448084B2|2010-04-08|2013-05-21|Twitter, Inc.|User interface mechanics|
JP5241038B2|2010-07-01|2013-07-17|パナソニック株式会社|Electronic device, display control method, and program|
US20120026181A1|2010-07-30|2012-02-02|Google Inc.|Viewable boundary feedback|
JP5711479B2|2010-08-17|2015-04-30|キヤノン株式会社|Display control apparatus and control method thereof|
JP5478439B2|2010-09-14|2014-04-23|任天堂株式会社|Display control program, display control system, display control apparatus, and display control method|
JP5160604B2|2010-09-14|2013-03-13|任天堂株式会社|Display control program, display control system, display control apparatus, and display control method|
US8514252B1|2010-09-22|2013-08-20|Google Inc.|Feedback during crossing of zoom levels|
US9235233B2|2010-10-01|2016-01-12|Z124|Keyboard dismissed on closure of device|
US9052800B2|2010-10-01|2015-06-09|Z124|User interface with stacked application management|
JP5668401B2|2010-10-08|2015-02-12|ソニー株式会社|Information processing apparatus, information processing method, and program|
JP5679169B2|2010-10-20|2015-03-04|株式会社ソニー・コンピュータエンタテインメント|Menu display device, menu display control method, and program|
JP5701569B2|2010-10-20|2015-04-15|株式会社ソニー・コンピュータエンタテインメント|Image display device, image display control method, and program|
US9377950B2|2010-11-02|2016-06-28|Perceptive Pixel, Inc.|Touch-based annotation system with temporary modes|
KR20120053430A|2010-11-17|2012-05-25|삼성전자주식회사|Device and method for providing image effect in wireless terminal|
CN102486713B|2010-12-02|2014-12-31|联想有限公司|Display method and electronic device|
US9529866B2|2010-12-20|2016-12-27|Sybase, Inc.|Efficiently handling large data sets on mobile devices|
JP5612459B2|2010-12-24|2014-10-22|京セラ株式会社|Mobile terminal device|
US8762840B1|2011-01-09|2014-06-24|Beamberry Solutions Inc. d/b/a SLG Mobile, Inc.|Elastic canvas visual effects in user interface|
JP5418508B2|2011-01-13|2014-02-19|カシオ計算機株式会社|Electronic device, display control method and program|
WO2012119548A1|2011-03-07|2012-09-13|联想有限公司|Control method, control device, display device, and electronic device|
US8863039B2|2011-04-18|2014-10-14|Microsoft Corporation|Multi-dimensional boundary effects|
US9182897B2|2011-04-22|2015-11-10|Qualcomm Incorporated|Method and apparatus for intuitive wrapping of lists in a user interface|
US20120278754A1|2011-04-29|2012-11-01|Google Inc.|Elastic Over-Scroll|
US9281010B2|2011-05-31|2016-03-08|Samsung Electronics Co., Ltd.|Timeline-based content control method and apparatus using dynamic distortion of timeline bar, and method and apparatus for controlling video and audio clips using the same|
US9035967B2|2011-06-30|2015-05-19|Google Technology Holdings LLC|Method and device for enhancing scrolling and other operations on a display|
US8810533B2|2011-07-20|2014-08-19|Z124|Systems and methods for receiving gesture inputs spanning multiple input devices|
JP5935267B2|2011-09-01|2016-06-15|ソニー株式会社|Information processing apparatus, information processing method, and program|
US9229604B2|2011-09-01|2016-01-05|Sony Corporation|User interface element|
US8842057B2|2011-09-27|2014-09-23|Z124|Detail on triggers: transitional states|
US8624934B2|2011-09-29|2014-01-07|Microsoft Corporation|Dynamic display of icons on a small screen|
JP5999830B2|2011-10-28|2016-09-28|任天堂株式会社|Information processing program, information processing apparatus, information processing system, and information processing method|
CN104854549A|2012-10-31|2015-08-19|三星电子株式会社|Display apparatus and method thereof|
TW201319921A|2011-11-07|2013-05-16|Benq Corp|Method for screen control and method for screen display on a touch screen|
FR2984545A1|2011-12-20|2013-06-21|France Telecom|Method for navigating visual content e.g. text, in smartphone, involves deforming portion of displayed content when end of content lying in movement direction specified by navigation command is displayed on touch screen|
US10872454B2|2012-01-06|2020-12-22|Microsoft Technology Licensing, Llc|Panning animations|
KR20130086409A|2012-01-25|2013-08-02|삼성전자주식회사|Apparatus and method for controlling a scorll boundary action of terminal|
US9477642B2|2012-02-05|2016-10-25|Apple Inc.|Gesture-based navigation among content items|
JP2013200863A|2012-02-23|2013-10-03|Panasonic Corp|Electronic device|
JP2017084404A|2012-02-23|2017-05-18|パナソニックIpマネジメント株式会社|Electronic apparatus|
CN103294391A|2012-02-24|2013-09-11|宏达国际电子股份有限公司|Electronic apparatus and operating method thereof|
FR2987470A1|2012-02-29|2013-08-30|France Telecom|NAVIGATION METHOD WITHIN A DISPLAYABLE CONTENT USING NAVIGATION CONTROLS, NAVIGATION DEVICE AND PROGRAM THEREOF|
KR101872865B1|2012-03-05|2018-08-02|엘지전자 주식회사|Electronic Device And Method Of Controlling The Same|
CN102622178B|2012-03-09|2013-06-12|游图明|Touch screen electronic equipment-based method for warping plane image|
CN103309599A|2012-03-15|2013-09-18|华为终端有限公司|Touch screen sliding finding method and touch screen equipment|
US20130278603A1|2012-04-20|2013-10-24|Tuming You|Method, Electronic Device, And Computer Readable Medium For Distorting An Image On A Touch Screen|
GB2503654B|2012-06-27|2015-10-28|Samsung Electronics Co Ltd|A method and apparatus for outputting graphics to a display|
US9626090B2|2012-06-29|2017-04-18|Google Inc.|Systems and methods for scrolling through content displayed on an electronic device|
JP5994543B2|2012-10-03|2016-09-21|コニカミノルタ株式会社|Display system, display device, and display control program|
WO2014063834A1|2012-10-22|2014-05-01|Telefónica, S.A.|A computed implemented method and electronic device for providing visual feedback to a user when the edge of an object has been reached|
JP6081769B2|2012-10-23|2017-02-15|任天堂株式会社|Program, information processing apparatus, information processing method, and information processing system|
US9081410B2|2012-11-14|2015-07-14|Facebook, Inc.|Loading content on electronic device|
JP6185707B2|2012-11-16|2017-08-23|任天堂株式会社|Program, information processing apparatus, information processing system, and information processing method|
US20140152585A1|2012-12-04|2014-06-05|Research In Motion Limited|Scroll jump interface for touchscreen input/output device|
EP2741195A1|2012-12-07|2014-06-11|BlackBerry Limited|Methods and devices for scrolling a display page|
US9082348B2|2012-12-07|2015-07-14|Blackberry Limited|Methods and devices for scrolling a display page|
JP2014139776A|2012-12-19|2014-07-31|Canon Inc|Display controller, display control method, and program|
US10691230B2|2012-12-29|2020-06-23|Apple Inc.|Crown input for a wearable electronic device|
US9961721B2|2013-01-17|2018-05-01|Bsh Home Appliances Corporation|User interface for oven: info mode|
US20140201688A1|2013-01-17|2014-07-17|Bsh Home Appliances Corporation|User interface - gestural touch|
US9554689B2|2013-01-17|2017-01-31|Bsh Home Appliances Corporation|User interface—demo mode|
TW201433971A|2013-02-20|2014-09-01|Phoenix Tech Ltd|Method of indicating an edge of an electronic document|
US10334151B2|2013-04-22|2019-06-25|Google Llc|Phase detection autofocus using subaperture images|
US9329764B2|2013-03-15|2016-05-03|Google Inc.|Overscroll visual effects|
US9459705B2|2013-03-18|2016-10-04|Facebook, Inc.|Tilting to scroll|
JP2014182638A|2013-03-19|2014-09-29|Canon Inc|Display control unit, display control method and computer program|
US9940014B2|2013-05-03|2018-04-10|Adobe Systems Incorporated|Context visual organizer for multi-screen display|
WO2014192535A1|2013-05-27|2014-12-04|Necカシオモバイルコミュニケーションズ株式会社|Display control device, control method thereof, and program|
US20160132204A1|2013-05-27|2016-05-12|Nec Corporation|Information processing apparatus, processing method thereof, and program|
CN104238783B|2013-06-07|2017-09-12|阿里巴巴集团控股有限公司|The control method and device of a kind of touch-screen|
JP6155869B2|2013-06-11|2017-07-05|ソニー株式会社|Display control apparatus, display control method, and program|
JP6608576B2|2013-06-26|2019-11-20|京セラ株式会社|Electronic device and display control method|
KR102234400B1|2013-07-08|2021-03-31|삼성전자주식회사|Apparatas and method for changing the order or the position of list in an electronic device|
US9274701B2|2013-07-10|2016-03-01|Nvidia Corporation|Method and system for a creased paper effect on page limits|
KR102186548B1|2013-08-21|2020-12-07|삼성전자주식회사|Method, apparatus and recovering medium for screen display by executing scroll|
US10545657B2|2013-09-03|2020-01-28|Apple Inc.|User interface for manipulating user interface objects|
US11068128B2|2013-09-03|2021-07-20|Apple Inc.|User interface object manipulations in a user interface|
US20150089454A1|2013-09-25|2015-03-26|Kobo Incorporated|Overscroll stretch animation|
US8869062B1|2013-11-27|2014-10-21|Freedom Scientific, Inc.|Gesture-based screen-magnified touchscreen navigation|
US20150169196A1|2013-12-13|2015-06-18|Samsung Electronics Co., Ltd.|Method and apparatus for controlling an electronic device screen|
US10089787B2|2013-12-26|2018-10-02|Flir Systems Ab|Systems and methods for displaying infrared images|
EP3584671A1|2014-06-27|2019-12-25|Apple Inc.|Manipulation of calendar application in device with touch screen|
KR20160020738A|2014-08-14|2016-02-24|삼성전자주식회사|Electronic Device And Method For Providing User Interface Of The Same|
TW201610758A|2014-09-02|2016-03-16|蘋果公司|Button functionality|
CN106797493A|2014-09-02|2017-05-31|苹果公司|Music user interface|
CN105630335A|2014-11-07|2016-06-01|华硕电脑股份有限公司|Touch screen operation method and electronic device|
KR20160071869A|2014-12-12|2016-06-22|삼성전자주식회사|A display apparatus and a display method|
EP3032393B1|2014-12-12|2020-07-08|Samsung Electronics Co., Ltd|Display apparatus and display method|
JP5860526B2|2014-12-24|2016-02-16|株式会社ソニー・コンピュータエンタテインメント|Menu display device, menu display control method, and program|
KR102252321B1|2014-12-24|2021-05-14|삼성전자주식회사|A display apparatus and a display method|
KR20160084240A|2015-01-05|2016-07-13|삼성전자주식회사|A display apparatus and a display method|
KR102351317B1|2015-01-07|2022-01-14|삼성전자 주식회사|Method for displaying an electronic document and electronic device|
CN104898968A|2015-01-30|2015-09-09|小米科技有限责任公司|Method and device for displaying document on touch control display screen|
CN104850340B|2015-01-30|2018-11-30|小米科技有限责任公司|Document display method and device on touching display screen|
CN104636067A|2015-01-30|2015-05-20|小米科技有限责任公司|Method and device for displaying document on touch display screen|
US20160253837A1|2015-02-26|2016-09-01|Lytro, Inc.|Parallax bounce|
US10365807B2|2015-03-02|2019-07-30|Apple Inc.|Control of system zoom magnification using a rotatable input mechanism|
US10341632B2|2015-04-15|2019-07-02|Google Llc.|Spatial random access enabled video system with a three-dimensional viewing volume|
US10565734B2|2015-04-15|2020-02-18|Google Llc|Video capture, processing, calibration, computational fiber artifact removal, and light-field pipeline|
US10567464B2|2015-04-15|2020-02-18|Google Llc|Video compression with adaptive view-dependent lighting removal|
US10546424B2|2015-04-15|2020-01-28|Google Llc|Layered content delivery for virtual and augmented reality experiences|
US10540818B2|2015-04-15|2020-01-21|Google Llc|Stereo image generation and interactive playback|
US10412373B2|2015-04-15|2019-09-10|Google Llc|Image capture for virtual reality displays|
US10419737B2|2015-04-15|2019-09-17|Google Llc|Data structures and delivery methods for expediting virtual reality playback|
US10275898B1|2015-04-15|2019-04-30|Google Llc|Wedge-based light-field video capture|
US10469873B2|2015-04-15|2019-11-05|Google Llc|Encoding and decoding virtual reality video|
JP6489214B2|2015-06-05|2019-03-27|京セラドキュメントソリューションズ株式会社|Display device and display control method|
US9979909B2|2015-07-24|2018-05-22|Lytro, Inc.|Automatic lens flare detection and correction for light-field images|
US9858649B2|2015-09-30|2018-01-02|Lytro, Inc.|Depth-based image blurring|
JP6046226B2|2015-10-08|2016-12-14|京セラ株式会社|Portable terminal device, program, and display control method|
CN106855796A|2015-12-09|2017-06-16|阿里巴巴集团控股有限公司|A kind of data processing method, device and intelligent terminal|
CN106855778A|2015-12-09|2017-06-16|阿里巴巴集团控股有限公司|The processing method of interface operation, device and intelligent terminal|
US10296088B2|2016-01-26|2019-05-21|Futurewei Technologies, Inc.|Haptic correlated graphic effects|
CN105786358A|2016-02-26|2016-07-20|华为技术有限公司|Terminal display interface dragging method and device|
US10275892B2|2016-06-09|2019-04-30|Google Llc|Multi-view scene segmentation and propagation|
JP6455489B2|2016-06-20|2019-01-23|京セラドキュメントソリューションズ株式会社|Display device and display control program|
CN107807775B|2016-09-09|2021-08-03|佳能株式会社|Display control device, control method thereof, and storage medium storing control program thereof|
US10679361B2|2016-12-05|2020-06-09|Google Llc|Multi-view rotoscope contour propagation|
TWI724096B|2017-01-23|2021-04-11|香港商斑馬智行網絡(香港)有限公司|Processing method, device and smart terminal for interface operation|
US10594945B2|2017-04-03|2020-03-17|Google Llc|Generating dolly zoom effect using light field image data|
CN108803969B|2017-05-03|2021-05-07|腾讯科技(深圳)有限公司|Information list display method, application terminal and storage device|
US10474227B2|2017-05-09|2019-11-12|Google Llc|Generation of virtual reality with 6 degrees of freedom from limited viewer data|
US10444931B2|2017-05-09|2019-10-15|Google Llc|Vantage generation and interactive playback|
US10440407B2|2017-05-09|2019-10-08|Google Llc|Adaptive control for immersive experience delivery|
US10354399B2|2017-05-25|2019-07-16|Google Llc|Multi-view back-projection to a light-field|
US10545215B2|2017-09-13|2020-01-28|Google Llc|4D camera tracking and optical stabilization|
CN107977150A|2017-10-31|2018-05-01|阿里巴巴集团控股有限公司|A kind of view scrolling method, device and electronic equipment|
US10965862B2|2018-01-18|2021-03-30|Google Llc|Multi-camera navigation interface|
BR102018017046A2|2018-08-20|2020-03-10|Samsung Eletrônica da Amazônia Ltda.|METHOD FOR CONTROL OF EXECUTION OF THE ANIMATION PROCESS IN ELECTRONIC DEVICES|
DK179888B1|2018-09-11|2019-08-27|Apple Inc.|CONTENT-BASED TACTICAL OUTPUTS|
CN112596648A|2020-12-21|2021-04-02|北京百度网讯科技有限公司|Page processing method and device, electronic equipment and readable storage medium|
CN112882636A|2021-02-18|2021-06-01|上海哔哩哔哩科技有限公司|Picture processing method and device|
Legal status:
2020-09-29| B06U| Preliminary requirement: requests with searches performed by other patent offices: procedure suspended [chapter 6.21 patent gazette]|
2020-10-06| B06F| Objections, documents and/or translations needed after an examination request according [chapter 6.6 patent gazette]|
2021-01-12| B09A| Decision: intention to grant [chapter 9.1 patent gazette]|
2021-03-23| B16A| Patent or certificate of addition of invention granted [chapter 16.1 patent gazette]|Free format text: TERM OF VALIDITY: 10 (TEN) YEARS COUNTED FROM 23/03/2021, SUBJECT TO THE LEGAL CONDITIONS. |
Priority:
Application number | Filing date | Patent title
US12/580,956|2009-10-16|
US12/580,956|US8624925B2|2009-10-16|2009-10-16|Content boundary signaling techniques|
PCT/US2010/051280|WO2011046766A2|2009-10-16|2010-10-04|Content boundary signaling techniques|